Dictator functions maximize mutual information
Authors
Abstract
Similar resources
Two Dictator Functions Maximize Mutual Information
Let (X, Y) denote n independent, identically distributed copies of a pair (X₁, Y₁) of arbitrarily correlated Rademacher random variables on {−1, 1}. We prove that the inequality I(f(X); g(Y)) ≤ I(X₁; Y₁) holds for any two Boolean functions f, g : {−1, 1}^n → {−1, 1}, where I(·;·) denotes mutual information. We further show that equality is in general achieved only by the dictator functions f(x) = ±g(x) = ±...
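Since the snippet stops short, a minimal numerical sketch may help make the inequality concrete. The code below is our own illustration, not the authors' code: it builds n i.i.d. Rademacher pairs with correlation rho (the helper names `mi` and `boolean_mi`, and the choices n = 3, rho = 0.6, are ours) and checks that a dictator pair attains the single-pair mutual information while majority falls strictly below it.

```python
# Sketch: numerically check I(f(X); g(Y)) <= I(X1; Y1) for small n,
# assuming a doubly symmetric binary source with correlation rho.
import itertools
import math

def mi(joint):
    """Mutual information (bits) of a joint pmf given as {(u, v): p}."""
    pu, pv = {}, {}
    for (u, v), p in joint.items():
        pu[u] = pu.get(u, 0.0) + p
        pv[v] = pv.get(v, 0.0) + p
    return sum(p * math.log2(p / (pu[u] * pv[v]))
               for (u, v), p in joint.items() if p > 0)

def boolean_mi(f, g, n, rho):
    """I(f(X); g(Y)) for n i.i.d. Rademacher pairs with E[X_i Y_i] = rho."""
    pair = {(1, 1): (1 + rho) / 4, (-1, -1): (1 + rho) / 4,
            (1, -1): (1 - rho) / 4, (-1, 1): (1 - rho) / 4}
    joint = {}
    for x in itertools.product((-1, 1), repeat=n):
        for y in itertools.product((-1, 1), repeat=n):
            p = math.prod(pair[(xi, yi)] for xi, yi in zip(x, y))
            key = (f(x), g(y))
            joint[key] = joint.get(key, 0.0) + p
    return mi(joint)

n, rho = 3, 0.6
single_pair = boolean_mi(lambda x: x[0], lambda y: y[0], 1, rho)  # I(X1; Y1)
dictator = boolean_mi(lambda x: x[0], lambda y: y[0], n, rho)     # attains equality
majority = boolean_mi(lambda x: 1 if sum(x) > 0 else -1,
                      lambda y: 1 if sum(y) > 0 else -1, n, rho)  # strictly below
print(single_pair, dictator, majority)
```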
Canalizing Boolean Functions Maximize the Mutual Information
The ability of biologically motivated Boolean networks to process information has attracted recent interest in information-theoretic research. One measure to quantify this ability is the well-known mutual information. Using Fourier analysis, we show that canalizing functions maximize the mutual information between an input variable and the outcome of the function. We prove our result for Boolean func...
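To make the canalizing claim tangible, here is a small contrast of our own (not taken from the paper): under uniform inputs on {−1, 1}^n, a canalizing function such as OR carries positive mutual information about each input variable, while parity, which is not canalizing, carries none. The function names and the choice n = 3 are ours.

```python
# Sketch: compare I(X_1; f(X)) for a canalizing function (OR) and a
# non-canalizing one (parity) under uniform inputs on {-1, 1}^n.
import itertools
import math

def input_output_mi(f, n):
    """I(X_1; f(X)) in bits, for X uniform on {-1, 1}^n."""
    joint, pu, pv = {}, {}, {}
    for x in itertools.product((-1, 1), repeat=n):
        key = (x[0], f(x))
        joint[key] = joint.get(key, 0.0) + 2.0 ** (-n)
    for (u, v), p in joint.items():
        pu[u] = pu.get(u, 0.0) + p
        pv[v] = pv.get(v, 0.0) + p
    return sum(p * math.log2(p / (pu[u] * pv[v]))
               for (u, v), p in joint.items() if p > 0)

n = 3
or_fn = lambda x: 1 if any(v == 1 for v in x) else -1  # canalizing: a single 1 forces the output
parity = lambda x: math.prod(x)                        # not canalizing
print(input_output_mi(or_fn, n), input_output_mi(parity, n))  # positive vs. 0.0
```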
Hard Clusters Maximize Mutual Information
In this paper, we investigate mutual information as a cost function for clustering, and show in which cases hard, i.e., deterministic, clusters are optimal. Using convexity properties of mutual information, we show that certain formulations of the information bottleneck problem are solved by hard clusters. Similarly, hard clusters are optimal for the information-theoretic co-clustering problem ...
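The convexity point can be illustrated directly: for a fixed joint distribution p(x, y), I(T; Y) is convex in the soft assignment p(t | x), so its maximum over the simplex is attained at a vertex, i.e., at a deterministic clustering. The sketch below is our own construction with a made-up 3 × 2 joint; it compares the best hard two-cluster assignment against many random soft ones.

```python
# Sketch: for a fixed (hypothetical) joint p(x, y), no random soft
# assignment p(t | x) beats the best hard clustering in I(T; Y).
import itertools
import math
import random

p_xy = [[0.30, 0.05],  # made-up joint p(x, y): 3 values of x, 2 of y
        [0.05, 0.30],
        [0.15, 0.15]]

def mi_ty(assign):
    """I(T; Y) in bits, where assign[x][t] = p(t | x) with two clusters."""
    p_ty = [[0.0, 0.0], [0.0, 0.0]]
    for x in range(3):
        for y in range(2):
            for t in range(2):
                p_ty[t][y] += p_xy[x][y] * assign[x][t]
    pt = [sum(row) for row in p_ty]
    py = [p_ty[0][y] + p_ty[1][y] for y in range(2)]
    return sum(p_ty[t][y] * math.log2(p_ty[t][y] / (pt[t] * py[y]))
               for t in range(2) for y in range(2) if p_ty[t][y] > 0)

def random_soft():
    rows = []
    for _ in range(3):
        r = random.random()
        rows.append([1 - r, r])
    return rows

best_hard = max(mi_ty([[1 - b, b] for b in bits])
                for bits in itertools.product((0, 1), repeat=3))
best_soft = max(mi_ty(random_soft()) for _ in range(10000))
print(best_hard, best_soft)  # best_soft never exceeds best_hard
```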
Comments on “Canalizing Boolean Functions Maximize Mutual Information”
In their recent paper “Canalizing Boolean Functions Maximize Mutual Information,” Klotz et al. argued, via Fourier analysis on the hypercube, that canalizing Boolean functions maximize certain mutual information quantities. This note supplies short new proofs of their results based on a coupling argument and also clarifies a point on the necessity of considering randomized functions.
Mutual Information Functions Versus Correlation Functions
This paper studies one application of mutual information to symbolic sequences: the mutual information function M(d). This function is compared with the more frequently used correlation function Γ(d). An exact relation between M(d) and Γ(d) is derived for binary sequences. For sequences with more than two symbols, no such general relation exists; in particular, Γ(d) = 0 may or may not lead to M(d) ...
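As an empirical companion (our own construction, not the paper's), the sketch below estimates both M(d) and Γ(d) from a binary {−1, 1} sequence generated by a two-state Markov chain; the flip probability q = 0.2 and the sequence length are arbitrary choices.

```python
# Sketch: estimate the mutual information function M(d) and the
# correlation function Gamma(d) from a binary {-1, 1} sequence.
import math
import random

def markov_sequence(length, q=0.2, seed=0):
    """Two-state Markov chain on {-1, 1} that flips with probability q."""
    rng = random.Random(seed)
    s, out = 1, []
    for _ in range(length):
        out.append(s)
        if rng.random() < q:
            s = -s
    return out

def gamma(seq, d):
    """Correlation function: E[s_i s_{i+d}] - E[s_i]^2."""
    pairs = list(zip(seq, seq[d:]))
    mean = sum(seq) / len(seq)
    return sum(a * b for a, b in pairs) / len(pairs) - mean * mean

def m(seq, d):
    """Mutual information (bits) between symbols at distance d."""
    pairs = list(zip(seq, seq[d:]))
    n = len(pairs)
    counts, pa, pb = {}, {}, {}
    for a, b in pairs:
        counts[(a, b)] = counts.get((a, b), 0) + 1
        pa[a] = pa.get(a, 0) + 1
        pb[b] = pb.get(b, 0) + 1
    return sum((c / n) * math.log2((c / n) / ((pa[a] / n) * (pb[b] / n)))
               for (a, b), c in counts.items())

seq = markov_sequence(200_000)
for d in (1, 2, 5, 10):
    print(d, gamma(seq, d), m(seq, d))  # both decay as d grows for this chain
```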
Journal
Journal title: The Annals of Applied Probability
Year: 2018
ISSN: 1050-5164
DOI: 10.1214/18-aap1384